22 research outputs found

    Event Data Definition in LHCb

    We present the approach used to define the event object model for the LHCb experiment. The approach is based on a high-level modelling language that is independent of the programming language used in the current implementation of the event data processing software. We evaluate the possible object modelling languages and show the advantages of a dedicated, XML-based model over the other candidates. After a description of the language itself, we explain the benefits of applying this approach to the description of the event model of an experiment such as LHCb: a uniform and coherent mapping of the object model to the implementation language across the experiment's software development teams, easy maintenance of the event model, conformance to the experiment's coding rules, and more. The description of the object model is parsed by a so-called front-end, which feeds several back-ends. We give an introduction to the model itself and to the currently implemented back-ends, which produce programming-language-specific implementations of event objects as well as meta-information about these objects. This meta-information can be used for run-time introspection of objects, which is essential for functionality such as object persistency and interactive analysis. The resulting object introspection package for C++ has been adopted by the LCG project as the starting point for the LCG object dictionary that is going to be developed in common for the LHC experiments. The current status of the event object modelling and its usage in LHCb are presented, and the prospects of further developments are discussed.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003; 7 pages, LaTeX, 2 EPS figures. PSN MOJT00
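    The front-end/back-end pipeline described in this abstract can be illustrated with a minimal sketch: an XML class description is parsed into a neutral intermediate representation, which a back-end then turns into a language-specific implementation. The XML schema, class names, and generated code layout below are hypothetical illustrations, not the actual LHCb model language.

```python
# Minimal sketch of an XML front-end feeding a C++ code-generation
# back-end. The schema and names are invented for illustration only.
import xml.etree.ElementTree as ET

MODEL = """
<gdd>
  <class name="MCParticle" desc="Monte Carlo particle">
    <attribute name="momentum" type="double" desc="particle momentum"/>
    <attribute name="particleID" type="int" desc="PDG identifier"/>
  </class>
</gdd>
"""

def parse_model(xml_text):
    """Front-end: parse the XML description into a plain list of dicts."""
    root = ET.fromstring(xml_text)
    classes = []
    for cls in root.findall("class"):
        attrs = [(a.get("name"), a.get("type"), a.get("desc"))
                 for a in cls.findall("attribute")]
        classes.append({"name": cls.get("name"),
                        "desc": cls.get("desc"),
                        "attributes": attrs})
    return classes

def cpp_backend(classes):
    """One possible back-end: emit a C++ header with members and accessors."""
    out = []
    for c in classes:
        out.append(f"// {c['desc']}")
        out.append(f"class {c['name']} {{")
        out.append("public:")
        for name, ctype, desc in c["attributes"]:
            out.append(f"  {ctype} {name}() const {{ return m_{name}; }}  // {desc}")
        out.append("private:")
        for name, ctype, _ in c["attributes"]:
            out.append(f"  {ctype} m_{name};")
        out.append("};")
    return "\n".join(out)

print(cpp_backend(parse_model(MODEL)))
```

    A second back-end consuming the same intermediate representation could emit the meta-information (class and attribute names, types, comments) used for run-time introspection, which is the key point of the shared front-end design.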

    Software Challenges For HL-LHC Data Analysis

    The high energy physics community is discussing where investment is needed to prepare software for the HL-LHC and its unprecedented challenges. The ROOT project has been one of the central software players in high energy physics for decades. From its experience and expectations, the ROOT team has distilled a comprehensive set of areas that should see research and development in the context of data analysis software, in order to make the best use of the HL-LHC's physics potential. This work shows what these areas could be, why the ROOT team believes investing in them is needed, which gains are expected, and where related work is ongoing. It can serve as an indication for future research proposals and cooperation.

    Rubrics for the Assessment of Competences [Rúbriques per a l'avaluació de competències]

    The Spanish version is available as a related resource. This workbook is an introduction to the use of rubrics for the assessment of learning in higher education. Based on the course «Elaboració de rúbriques per a l'avaluació de les competències transversals» ("Preparing rubrics for the assessment of transversal competences"), organized by the Institut de Ciències de l'Educació of the Universitat de Barcelona, it gathers the results of the participation of teaching staff from various UB degree programmes, whose aim was to collaborate across disciplines in preparing rubrics for assessing transversal competences, first, and then rubrics for their own course subjects. To prepare the rubrics on transversal competences, the competences were defined and assessment indicators and criteria were established. We consider that the outcome of the analysis, the definitions, the indicators, and the rubrics themselves is not generalizable, but it constitutes first-hand material for the development and implementation of rubrics in course assessment, starting from the models offered.

    Rubrics for the Assessment of Competences [Rúbricas para la evaluación de competencias]

    The Catalan version is available as a related resource. This workbook is an introduction to the use of rubrics for the assessment of learning in higher education. Based on the course «Elaboración de rúbricas para la evaluación de competencias transversales» ("Preparing rubrics for the assessment of transversal competences"), organized by the Institut de Ciències de l'Educació of the Universitat de Barcelona, it gathers the results of the participation of teaching staff from different UB degree programmes, whose aim was to collaborate across disciplines in preparing rubrics for assessing transversal competences, first, and then rubrics for their own course subjects. To prepare the rubrics on transversal competences, the competences were defined and assessment indicators and criteria were established. We consider that the result of the analysis, the definitions, the indicators, and the rubrics themselves is not generalizable, but it constitutes first-hand material for the development and implementation of rubrics in the assessment of subjects, starting from the models offered.

    1st Developers@CERN Forum

    The PH-SFT group is responsible for the build, test, and deployment of the set of external software packages used by the LHC experiments. This set comprises around 170 packages, including Grid packages and Monte Carlo generators, provided in several versions. A complete build structure has been established to guarantee the quality of the packages provided by the group. This structure includes an experimental build and three daily nightly builds, each dedicated to a specific ROOT version: v6.02, v6.04, and the master. While the experimental build is dedicated to testing new packages, versions, and dependencies (and is mainly used internally by SFT), the three nightly builds are responsible for deploying to AFS the set of stable, well-tested packages requested by the LHC experiments, so that they can apply their own builds on top. In all cases, a complete build of these packages is provided for different platforms and compilers. In this talk, we show the build and test infrastructure we provide to the experiments, including a description of the specific tools used for this purpose.
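    The nightly builds described above fan out over ROOT versions, platforms, and compilers. A toy sketch of such a build matrix is below; the slot names, platform strings, and compiler tags are illustrative values, not the actual PH-SFT configuration.

```python
# Toy build-matrix expansion: one nightly "slot" per ROOT version,
# each built for every platform/compiler combination. All names are
# hypothetical examples.
from itertools import product

slots = ["root-v6.02", "root-v6.04", "root-master"]
platforms = ["x86_64-slc6", "x86_64-centos7"]
compilers = ["gcc49", "clang37"]

def build_jobs():
    """Expand the full set of (slot, platform, compiler) build jobs."""
    return [{"slot": s, "platform": p, "compiler": c}
            for s, p, c in product(slots, platforms, compilers)]

jobs = build_jobs()
print(len(jobs))  # 3 slots x 2 platforms x 2 compilers = 12 jobs
```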

    Detector Description Framework in LHCb

    The Gaudi architecture and framework are designed to provide a common infrastructure and environment for simulation, filtering, reconstruction, and analysis applications. In this context, a Detector Description Service was developed in LHCb in order to also provide easy and coherent access to the description of the experimental apparatus. This service centralizes all information about the detector, including geometry, materials, alignment, calibration, structure, and controls. From the proof of concept given by its first functional implementation in late 2000, the Detector Description Service has grown to become one of the major components of the LHCb software, shared among all applications, including simulation, reconstruction, analysis, and visualization. We describe here the full and functional implementation of the service. We stress the ease with which users can customize and extend the detector description, the seamless integration with condition databases to handle time-dependent data, and the choice of XML as a persistency back-end for LHCb detector data. We show how the detector description can be shared among applications while still containing application-specific data and supporting several levels of granularity. We present several external tools that add value to the Detector Description Service, such as a dedicated, easy-to-use XML editor and several geometry checkers. We finally give hints on how this service could evolve into part of a common effort between all LHC experiments that would aim at defining common detector description tools at the level of the LCG project.
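    The handling of time-dependent data mentioned above rests on conditions with intervals of validity: a lookup returns the value whose validity window covers the event time. The sketch below is a much-simplified toy, not the real Gaudi/LHCb service; the class names, the detector path, and the condition payloads are hypothetical.

```python
# Toy "detector description" lookup with time-dependent conditions.
# Each condition carries a validity interval [since, until); a query
# at a given event time returns the value valid at that time.
class Condition:
    def __init__(self, since, until, value):
        self.since, self.until, self.value = since, until, value

class DetectorElement:
    def __init__(self, path):
        self.path = path
        self._conditions = {}  # name -> list of Condition objects

    def add_condition(self, name, cond):
        self._conditions.setdefault(name, []).append(cond)

    def condition(self, name, event_time):
        """Return the condition value whose validity covers event_time."""
        for c in self._conditions.get(name, []):
            if c.since <= event_time < c.until:
                return c.value
        raise KeyError(f"no valid {name!r} for {self.path} at t={event_time}")

# Usage: an alignment constant that changes value at time 100.
velo = DetectorElement("/dd/Structure/Velo")  # hypothetical path
velo.add_condition("alignment", Condition(0, 100, {"dx": 0.0}))
velo.add_condition("alignment", Condition(100, 200, {"dx": 0.3}))
print(velo.condition("alignment", 150))  # -> {'dx': 0.3}
```

    The same lookup shape lets simulation, reconstruction, and analysis share one description while each application sees the conditions appropriate to the events it processes.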

    A Software Institute for Data-Intensive Sciences, Joining Computer Science Academia and Natural Science Research

    With the ever-increasing size of scientific collaborations and the complexity of scientific instruments, the software needed to acquire, process, and analyze the gathered data is growing in both complexity and size. Unfortunately, the role and career path of scientists and engineers working on software R&D and developing scientific software are neither clearly established nor defined in many fields of natural science. In addition, the exchange of information between scientific software development and computer science departments at universities or computing schools is scattered and fragmented into individual initiatives. To address these issues we propose a new effort at the European level that concentrates on strengthening the role of software developers in the natural sciences, acts as a hub for the exchange of ideas among the different stakeholders in computer science and scientific software, and forms a lobbying forum for software engineering in the natural sciences at the international level. This contribution discusses in detail the motivation, role, and interplay with other initiatives of a "Software Institute for Data-Intensive Sciences", which is currently being discussed between research institutes, universities, and funding agencies in Europe. In addition to the current status, an outlook on the future prospects of this initiative is given.